
    Imaging of the Lamina Cribrosa using Swept-Source Optical Coherence Tomography.

    The lamina cribrosa (LC) is the presumed site of axonal injury in glaucoma. Its deformation has been suggested to contribute to optic neuropathy by impeding axoplasmic flow within the optic nerve fibers, leading to apoptosis of retinal ganglion cells. To visualize the LC in vivo, optical coherence tomography (OCT) has been applied. Spectral domain (SD)-OCT, used in conjunction with the recently introduced enhanced depth imaging (EDI)-OCT, has improved visualization of deeper ocular layers, but in many individuals it is still limited by inadequate resolution, poor image contrast, and insufficient depth penetration. The posterior laminar surface in particular is not viewed clearly with these methods. New-generation high-penetration (HP)-OCT, also known as swept-source (SS)-OCT, can evaluate the choroid in vivo to a remarkable level of detail. SS-OCT uses a longer wavelength (1,050 nm instead of 840 nm) than the conventional techniques. We review current knowledge of the LC, findings from trials that use SD-OCT and EDI-OCT, and our experience with a prototype SS-OCT to visualize the LC in its entirety.
    Key Points
    What is known?
    • The LC is the presumed site of axonal injury in glaucoma
    • Compared to spectral domain-OCT, enhanced depth imaging-OCT improves imaging of the LC
    • Even so, currently used SD-OCT techniques are restricted by poor wavelength penetration of the deeper ocular layers
    What do our findings add?
    • SS-OCT may be a superior imaging modality for deep ocular structures
    • Prior studies used SS-OCT to evaluate choroidal thickness in both healthy and normal-tension glaucoma eyes
    • SS-OCT enables good evaluation of three-dimensional (3D) lamina cribrosa morphology.
    How to cite this article: Nuyen B, Mansouri K, Weinreb RN. Imaging of the Lamina Cribrosa using Swept-Source Optical Coherence Tomography. J Current Glau Prac 2012;6(3):113-119.

    Pedestrian route taking behaviour at night and street lighting: A pilot study


    The synergistic effect of operational research and big data analytics in greening container terminal operations: a review and future directions

    Container Terminals (CTs) are continuously presented with highly interrelated, complex, and uncertain planning tasks. The ever-increasing intensity of operations at CTs in recent years has also resulted in increasing environmental concerns, and they are experiencing unprecedented pressure to lower their emissions. Operational Research (OR), as a key player in the optimisation of the complex decision problems that arise from the quay- and land-side operations at CTs, has therefore been presented with new challenges and opportunities to incorporate environmental considerations into decision making and to better utilise the 'big data' that is continuously generated from the never-stopping operations at CTs. The state-of-the-art literature on OR's incorporation of environmental considerations and its interplay with Big Data Analytics (BDA) is, however, still very much underdeveloped, fragmented, and divergent, and a guiding framework is completely missing. This paper presents a review of the most relevant developments in the field and sheds light on promising research opportunities for better exploiting the synergistic effect of the two disciplines in addressing CT operational problems, while incorporating uncertainty and environmental concerns efficiently. The paper finds that while OR has thus far contributed (rather implicitly) to improving the environmental performance of CTs, this can be stepped up much further with more explicit incorporation of environmental considerations and better exploitation of BDA predictive modelling capabilities. New interdisciplinary research at the intersection of conventional CT optimisation problems, energy management and sizing, and the adoption of net-zero technologies and energy vectors is also presented as a prominent line of future research.

    Neural Predictive Control of Unknown Chaotic Systems

    In this work, a neural network is developed for modelling and controlling a chaotic system based on measured input-output data pairs. In the chaos modelling phase, a neural network is trained on the unknown system. Then, a predictive control mechanism is implemented with the neural network to reach the close neighborhood of a chosen unstable fixed point embedded in the chaotic system. The effectiveness of the proposed method, for both modelling and prediction-based control, is demonstrated on the chaotic logistic equation and the Hénon map.
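    The two phases described above (learn the unknown map from data, then steer the one-step-ahead prediction toward an unstable fixed point) can be sketched for the logistic map. This is a minimal illustration, not the paper's method: a quadratic polynomial stands in for the neural network, and all parameter values are assumptions.

```python
import numpy as np

R = 3.9                    # logistic-map parameter (chaotic regime)
X_STAR = 1.0 - 1.0 / R     # unstable fixed point of x' = R*x*(1-x)

def logistic(x):
    return R * x * (1.0 - x)

# --- Modelling phase: fit a surrogate to measured input-output pairs.
# (A polynomial stands in here for the paper's neural network.)
rng = np.random.default_rng(0)
xs = rng.uniform(0.05, 0.95, 500)
coef = np.polyfit(xs, logistic(xs), deg=2)

def model(x):
    return np.polyval(coef, x)

# --- Predictive-control phase: choose a small bounded input u that
# moves the predicted next state toward the unstable fixed point.
x = 0.3
for _ in range(50):
    u = np.clip(X_STAR - model(x), -0.05, 0.05)   # bounded control effort
    x = logistic(x) + u                            # true plant plus control

print(abs(x - X_STAR))   # the state stays pinned near the unstable point
```

    Without the control input the trajectory wanders chaotically; with it, the learned model is accurate enough that the small corrections stabilize the otherwise unstable fixed point.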

    Tests of relativity using a microwave resonator

    The frequencies of a cryogenic sapphire oscillator and a hydrogen maser are compared to set new constraints on a possible violation of Lorentz invariance. We determine the variation of the oscillator frequency as a function of its orientation (Michelson-Morley test) and of its velocity (Kennedy-Thorndike test) with respect to a preferred frame candidate. We constrain the corresponding parameters of the Mansouri and Sexl test theory to $\delta - \beta + 1/2 = (1.5 \pm 4.2) \times 10^{-9}$ and $\beta - \alpha - 1 = (-3.1 \pm 6.9) \times 10^{-7}$, which is equivalent to the best previous result for the former and represents a 30-fold improvement for the latter.
    Comment: 8 pages, 2 figures, submitted to Physical Review Letters (October 3, 2002).
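    For context, a standard way to write the orientation- and velocity-dependent fractional frequency shifts in the Robertson-Mansouri-Sexl framework is (this is the commonly quoted textbook form, not an equation taken from the abstract):

    \[
    \left.\frac{\Delta\nu}{\nu}\right|_{\mathrm{MM}} \;\sim\; \left(\delta - \beta + \tfrac{1}{2}\right)\frac{v^2}{c^2}\,\sin^2\theta,
    \qquad
    \left.\frac{\Delta\nu}{\nu}\right|_{\mathrm{KT}} \;\sim\; \left(\beta - \alpha - 1\right)\frac{v^2}{c^2},
    \]

    where $v$ is the velocity and $\theta$ the orientation of the resonator with respect to the preferred-frame candidate. A null result for the $\sin^2\theta$ modulation bounds $\delta - \beta + 1/2$ (Michelson-Morley test), while a null result for the $v^2$ dependence bounds $\beta - \alpha - 1$ (Kennedy-Thorndike test).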

    The effect of 12 weeks Anethum graveolens (dill) on metabolic markers in patients with metabolic syndrome; A randomized double blind controlled trial

    Background: The clustering of metabolic abnormalities defined as metabolic syndrome is now both a public health and a clinical problem. While interest in herbal medicine has greatly increased, there is a lack of human evidence to support efficacies shown in animals. This clinical trial was designed to investigate whether a herbal medicine, Anethum graveolens (dill) extract, could improve metabolic components in patients with metabolic syndrome. Methods: A double-blind, randomized, placebo-controlled trial using a parallel design was conducted. 24 subjects who met the metabolic syndrome diagnostic criteria (update of ATP III) were randomly assigned to either dill extract (n = 12) or placebo (n = 12) for 3 months. Results: Among the lipid components of metabolic syndrome, no significant differences in triglyceride (TG) concentration and high density lipoprotein cholesterol were seen between the two groups. However, TG improved significantly from baseline (257.0 vs. 201.5, p = 0.01) with dill treatment, but such a significant effect was not observed in the placebo group. Moreover, no significant differences in waist circumference, blood pressure and fasting blood sugar were seen between the two groups after the 3-month follow-up period. Conclusion: In this small clinical trial in patients with metabolic syndrome, 12 weeks of dill extract treatment had a beneficial effect in terms of reducing TG from baseline. However, dill treatment was not associated with a significant improvement in metabolic syndrome related markers compared to the control group. Larger studies might be required to prove the efficacy and safety of long-term administration of dill to resolve metabolic syndrome components. © 2012 Mansouri et al.; licensee BioMed Central Ltd.

    An Evolutionary Approach to Load Balancing Parallel Computations

    We present a new approach to balancing the workload in a multicomputer when the problem is decomposed into subproblems mapped to the processors. It is based on a hybrid genetic algorithm. A number of design choices for genetic algorithms are combined to ameliorate the problem of premature convergence that is often encountered in implementations of classical genetic algorithms. The algorithm is hybridized by including a hill-climbing procedure which significantly improves the efficiency of the evolution. Moreover, it makes use of problem-specific information to avoid some computational costs and to reinforce favorable aspects of the genetic search at appropriate points. The experimental results show that the hybrid genetic algorithm can find solutions within 3% of the optimum in a reasonable time. They also suggest that this approach is not biased towards particular problem structures.
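    The hybridization idea (GA selection and crossover, with a hill-climbing polish applied to each offspring) can be sketched for a toy task-to-processor assignment problem. The instance, operators, and parameters below are illustrative assumptions, not the paper's implementation:

```python
import random

random.seed(1)
N_TASKS, N_PROCS = 40, 4
WORK = [random.randint(1, 20) for _ in range(N_TASKS)]  # synthetic task costs

def makespan(assign):
    """Load of the busiest processor -- the imbalance metric to minimize."""
    loads = [0] * N_PROCS
    for t, p in enumerate(assign):
        loads[p] += WORK[t]
    return max(loads)

def hill_climb(assign):
    """Local polish: keep moving single tasks while the makespan drops."""
    improved = True
    while improved:
        improved = False
        for t in range(N_TASKS):
            best_p, best_cost = assign[t], makespan(assign)
            for p in range(N_PROCS):
                assign[t] = p
                c = makespan(assign)
                if c < best_cost:
                    best_p, best_cost, improved = p, c, True
            assign[t] = best_p
    return assign

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

pop = [[random.randrange(N_PROCS) for _ in range(N_TASKS)] for _ in range(20)]
for gen in range(30):
    pop.sort(key=makespan)                      # truncation selection
    parents = pop[:10]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(10)]
    # Hybridization: polish each child with hill climbing before insertion.
    pop = parents + [hill_climb(c) for c in children]

best = min(pop, key=makespan)
lower_bound = -(-sum(WORK) // N_PROCS)  # ceil(total/P): no schedule beats this
print(makespan(best), lower_bound)
```

    The hill-climbing step is what makes the search "hybrid": crossover provides global recombination while the local polish drives every offspring to a nearby local optimum, which is the mechanism the abstract credits for the improved efficiency.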

    An Approach for Minimizing Spurious Errors in Testing Ada Tasking Programs

    We propose an approach for detecting deadlocks and race conditions in Ada tasking software. It is based on an extension to Petri net-based techniques, where a concurrent program is modeled as a Petri net and a reachability graph is then derived and analyzed for the desired information. In this approach, Predicate-Action subnets representing Ada programming constructs are described, where predicates and actions are attached to transitions. Predicates are those found in decision statements. Actions involve updating the status of the variables that affect the tasking behavior of the program and updating the Read and Write sets of shared variables. The shared variables are those occurring in sections of the program, called concurrency zones, related to the transitions. Modeling of a tasking program is accomplished by using the basic subnets as building blocks in translating only tasking-related statements and connecting them to produce the total Predicate-Action net model augmented with sets of shared variables. An augmented reachability graph is then derived by executing the net model. Deadlocks and race conditions are detected by searching the nodes of this graph. The main advantage offered by this approach is that the Predicate-Action extension of the net leads to pruning infeasible paths in the reachability graph and thus reduces the spurious error reports encountered in previous approaches. Also, this approach enables a partial handling of loops in a practical way. Implementation issues are also discussed in the paper.
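    The overall pipeline (model the program as a Petri net, derive the reachability graph, search its nodes for dead markings) can be illustrated on a toy net. The net below is a classic two-task resource-ordering example invented for illustration; it is not one of the paper's Predicate-Action subnets.

```python
from collections import deque

# A Petri net as (pre, post) arc weights per transition over places.
# Two tasks grab two shared resources in opposite order -- the classic
# structure that produces a deadlock.
PLACES = ["t1_start", "t1_hasA", "t2_start", "t2_hasB", "resA", "resB"]
TRANSITIONS = {
    "t1_takeA": ({"t1_start": 1, "resA": 1}, {"t1_hasA": 1}),
    "t1_takeB": ({"t1_hasA": 1, "resB": 1}, {"resA": 1, "resB": 1}),  # done
    "t2_takeB": ({"t2_start": 1, "resB": 1}, {"t2_hasB": 1}),
    "t2_takeA": ({"t2_hasB": 1, "resA": 1}, {"resA": 1, "resB": 1}),  # done
}

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def deadlocks(initial):
    """BFS over the reachability graph; a node with no enabled transition
    and an unfinished task is reported as a deadlock marking."""
    seen, dead = set(), []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        successors = [fire(m, pre, post)
                      for pre, post in TRANSITIONS.values()
                      if enabled(m, pre)]
        if not successors and (m["t1_hasA"] or m["t2_hasB"]):
            dead.append(m)
        queue.extend(successors)
    return dead

init = {p: 0 for p in PLACES}
init.update({"t1_start": 1, "t2_start": 1, "resA": 1, "resB": 1})
found = deadlocks(init)
print(found)   # the cross-acquisition deadlock marking
```

    The paper's contribution is what this sketch omits: predicates and actions attached to the transitions prune interleavings that are infeasible in the actual program, so the graph being searched contains fewer spurious paths.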

    Parallel Genetic Algorithms with Application to Load Balancing for Parallel Computing

    A new coarse-grain parallel genetic algorithm (PGA) and a new implementation of a data-parallel GA are presented in this paper. They are based on models of natural evolution in which the population is formed of discontinuous or continuous subpopulations. In addition to simulating natural evolution, the intrinsic parallelism in the two PGAs minimizes the possibility of premature convergence that implementations of classic GAs often encounter. Intrinsic parallelism also allows the evolution of fit genotypes in a smaller number of generations in the PGAs than in sequential GAs, leading to superlinear speed-ups. The PGAs have been implemented on a hypercube and a Connection Machine, and their operation is demonstrated by applying them to the load balancing problem in parallel computing. The PGAs have found near-optimal solutions which are comparable to the solutions of a simulated annealing algorithm and are better than those produced by a sequential GA and by other load balancing methods. On the one hand, the PGAs accentuate the advantage of parallel computers for simulating natural evolution. On the other hand, they represent new techniques for load balancing parallel computations.
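    The coarse-grain (island) model with discontinuous subpopulations can be sketched with ring migration, run here sequentially on a simple bitstring objective purely to show the subpopulation/migration mechanics; the objective, topology, and parameters are assumptions for illustration, not the paper's load-balancing setup:

```python
import random

random.seed(2)
GENES, N_ISLANDS, POP, GENS = 32, 4, 16, 80

def fitness(ind):
    return sum(ind)          # simple "count the ones" stand-in objective

def child(pop):
    a, b = random.choices(pop, k=2)
    cut = random.randrange(1, GENES)
    c = a[:cut] + b[cut:]                  # one-point crossover
    if random.random() < 0.2:              # occasional bit-flip mutation
        i = random.randrange(GENES)
        c[i] = 1 - c[i]
    return c

# Each island is an independent subpopulation evolving in isolation.
islands = [[[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
           for _ in range(N_ISLANDS)]

for gen in range(GENS):
    for isl in islands:
        isl.sort(key=fitness, reverse=True)
        keep = POP // 2                    # truncation selection
        isl[keep:] = [child(isl[:keep]) for _ in range(POP - keep)]
    if gen % 10 == 9:
        # Ring migration: each island's best replaces the worst
        # individual of the next island around the ring.
        bests = [max(isl, key=fitness) for isl in islands]
        for k, isl in enumerate(islands):
            isl.sort(key=fitness, reverse=True)
            isl[-1] = list(bests[(k - 1) % N_ISLANDS])

best = max((ind for isl in islands for ind in isl), key=fitness)
print(fitness(best))
```

    Because the islands evolve mostly in isolation and exchange only occasional migrants, each maintains its own genetic trajectory, which is the mechanism the abstract credits for reducing premature convergence relative to a single panmictic population.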